1 total entropy
полная энтропия; суммарная энтропия
[L.G. Sumenko. English-Russian Dictionary of Information Technology. Moscow: GP TsNIIS, 2003.]
English-Russian dictionary of normative technical terminology > total entropy
2 total entropy
полная энтропия; суммарная энтропия
3 total entropy
Comprehensive English-Russian and Russian-English dictionary > total entropy
4 total entropy
1) Computing: полная энтропия, суммарная энтропия
2) Refrigeration engineering: общая энтропия
5 total entropy
полная энтропия; суммарная энтропия
English-Russian dictionary of computer science and programming > total entropy
6 total entropy
7 total entropy
полная энтропия
English-Russian dictionary of technical terms > total entropy
8 total entropy
полная энтропия; суммарная энтропия
9 total entropy
English-Russian dictionary of computer science > total entropy
10 total entropy production
English-Russian dictionary of physics > total entropy production
11 entropy
information theory: энтропия
- character mean entropy
- conditional entropy
- entropy per symbol
- generalized entropy
- joint entropy
- mean entropy
- negative entropy
- population entropy
- posterior entropy
- relative entropy
- Shannon entropy
- total entropy
- unconditional entropy
English-Russian dictionary of computer science and programming > entropy
12 entropy
character mean entropy — средняя энтропия на знак
entropy — энтропия
joint entropy — общая энтропия
population entropy — энтропия совокупности
total entropy — полная энтропия
unconditional entropy — безусловная энтропия
13 entropy
negative entropy — отрицательная энтропия; негэнтропия
total entropy — полная энтропия; суммарная энтропия
14 entropy
(physics) энтропия
character mean entropy — средняя энтропия на знак
entropy — энтропия
joint entropy — общая энтропия
population entropy — энтропия совокупности
total entropy — полная энтропия
unconditional entropy — безусловная энтропия
15 character mean entropy
16 generalized entropy
17 relative entropy
18 Shannon entropy
19 isothermal entropy
English-Russian dictionary on nuclear energy > isothermal entropy
20 specific entropy
English-Russian dictionary on nuclear energy > specific entropy
See also in other dictionaries:
Entropy — This article is about entropy in thermodynamics. For entropy in information theory, see Entropy (information theory). For a comparison of entropy in information theory with entropy in thermodynamics, see Entropy in thermodynamics and information… … Wikipedia
Entropy (statistical thermodynamics) — In thermodynamics, statistical entropy is the modeling of the energetic function entropy using probability theory. The statistical entropy perspective was introduced in 1870 with the work of the Austrian physicist Ludwig Boltzmann. Mathematical… … Wikipedia
entropy — entropic /en troh pik, trop ik/, adj. entropically, adv. /en treuh pee/, n. 1. Thermodynam. a. (on a macroscopic scale) a function of thermodynamic variables, as temperature, pressure, or composition, that is a measure of the energy that is not… … Universalium
Entropy (information theory) — In information theory, entropy is a measure of the uncertainty associated with a random variable. The term by itself in this context usually refers to the Shannon entropy, which quantifies, in the sense of an expected value, the information… … Wikipedia
Entropy (arrow of time) — Entropy is the only quantity in the physical sciences that picks a particular direction for time, sometimes called an arrow of time. As one goes forward in time, the second law of thermodynamics says that the entropy of an isolated system can… … Wikipedia
Entropy (energy dispersal) — The thermodynamic concept of entropy can be described qualitatively as a measure of energy dispersal (energy distribution) at a specific temperature. Changes in entropy can be quantitatively related to the distribution or the spreading out of the … Wikipedia
Entropy of mixing — The entropy of mixing is the change in the configuration entropy, an extensive thermodynamic quantity, when two different chemical substances or components are mixed. This entropy change must be positive since there is more uncertainty about the… … Wikipedia
Total correlation — In probability theory and in particular in information theory, total correlation (Watanabe 1960) is one of several generalizations of the mutual information. It is also known as the multivariate constraint (Garner 1962) or multiinformation… … Wikipedia
Entropy (classical thermodynamics) — In thermodynamics, entropy is a measure of how close a thermodynamic system is to equilibrium. A thermodynamic system is any physical object or region of space that can be described by its thermodynamic quantities such as temperature, pressure,… … Wikipedia
entropy — (en tro pe) A measure of the randomness or disorder of a system; a measure of that part of the total energy in a system that is unavailable for useful work … Dictionary of microbiology
Joint entropy — The joint entropy is an entropy measure used in information theory. The joint entropy measures how much entropy is contained in a joint system of two random variables. If the random variables are X and Y, the joint entropy is written H(X,Y). Like … Wikipedia
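The Shannon and joint entropy definitions excerpted above can be made concrete with a short computation. This is an illustrative sketch only; the function names are my own and are not taken from any of the dictionaries cited:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) * log2 p(x), in bits.

    Quantifies the expected uncertainty of a random variable X
    given its probability distribution.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

def joint_entropy(joint_probs):
    """Joint entropy H(X, Y) = -sum_{x,y} p(x, y) * log2 p(x, y), in bits.

    Measures the entropy contained in the joint system of two
    random variables, given their joint distribution as a 2-D table.
    """
    return -sum(p * log2(p) for row in joint_probs for p in row if p > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries 0.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0

# Two independent fair coins: H(X, Y) = H(X) + H(Y) = 2 bits.
print(joint_entropy([[0.25, 0.25], [0.25, 0.25]]))  # 2.0
```

For independent variables the joint entropy equals the sum of the marginal entropies, as the last line shows; dependence between X and Y can only lower H(X, Y) below that sum.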